634 research outputs found

    A Comparison of Paper Sketch and Interactive Wireframe by Eye Movements Analysis, Survey, and Interview

    Eye movement-based analyses have been extensively performed on graphical user interface designs, mainly on high-fidelity prototypes such as coded prototypes. However, practitioners usually initiate the development life cycle with low-fidelity prototypes, such as mock-ups or sketches. Since little or no eye movement analysis has been performed on the latter, would eye tracking transpose its benefits from high- to low-fidelity prototypes and produce different results? To bridge this gap, we performed an eye movement-based analysis that compares gaze point indexes, gaze event types and durations, fixation, and saccade indexes produced by N=8 participants between two treatments, a paper prototype vs. a wireframe. The paper also reports a qualitative analysis based on the answers provided by these participants in a semi-directed interview and on a perceived usability questionnaire with 14 items. Due to its interactivity, the wireframe seems to foster a more exploratory approach to design (e.g., testing and navigating more extensively) than the paper prototype.
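
    The fixation and saccade indexes compared above are typically derived from raw gaze samples by an event-detection pass. As a purely illustrative sketch (the paper does not publish its pipeline), the following Python function implements a standard dispersion-threshold (I-DT) fixation detector; the data format, thresholds, and names are assumptions, not the authors' values.

    ```python
    # Minimal I-DT fixation detection: a window of gaze samples counts as a
    # fixation when its spatial dispersion stays small for long enough; the
    # gaps between fixations are treated as saccades.
    def fixations_idt(samples, max_dispersion=30.0, min_duration_ms=100.0):
        """samples: list of (t_ms, x, y) gaze points, in pixels.
        Returns a list of (start_ms, end_ms) fixation windows."""
        fixations, i, n = [], 0, len(samples)
        while i < n:
            j = i
            xs, ys = [samples[i][1]], [samples[i][2]]
            while j + 1 < n:  # grow the window while dispersion stays small
                xs.append(samples[j + 1][1]); ys.append(samples[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop(); ys.pop()
                    break
                j += 1
            if samples[j][0] - samples[i][0] >= min_duration_ms:
                fixations.append((samples[i][0], samples[j][0]))
                i = j + 1
            else:
                i += 1
        return fixations
    ```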

    Measuring User Experience of Adaptive User Interfaces using EEG: A Replication Study

    Adaptive user interfaces have the advantage of being able to dynamically change their aspect and/or behaviour depending on the characteristics of the context of use, i.e., to improve the user experience (UX). UX is an important quality factor that has been primarily evaluated with classical measures, but to a lesser extent with physiological measures such as emotion recognition, skin response, or brain activity. In a previous exploratory experiment involving users with different profiles and a wide range of ages, we analysed user experience in terms of cognitive load, engagement, attraction, and memorisation when employing twenty graphical adaptive menus, through the use of an electroencephalogram (EEG) device. The results indicated that there were statistically significant differences for these four variables. However, we considered it necessary to confirm or reject these findings using a more homogeneous group of users. We conducted an operational internal replication study with 40 participants. We also investigated the potential correlation between EEG signals and the participants' user experience ratings, such as their preferences. The results of this experiment confirm that there are statistically significant differences between the EEG variables when the participants interact with the different adaptive menus. Moreover, there is a high correlation between the participants' UX ratings and the EEG signals, and a trend regarding performance has emerged from our analysis. These findings suggest that EEG signals could be used to evaluate UX. With regard to the menus studied, our results suggest that graphical menus with different structures and font types produce more differences in users' brain responses, while menus which use colours produce more similarities in users' brain responses. Several insights with which to improve users' experience of graphical adaptive menus are outlined. Comment: 10 pages, 4 figures, 2 tables, 34 references, International Conference on Evaluation and Assessment in Software Engineering (EASE '23).
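
    The reported correlation between EEG signals and UX ratings can be sketched with a standard rank-correlation test. The snippet below is an assumption-laden illustration, not the study's actual pipeline: the per-menu engagement scores and ratings are invented, and Spearman's rho is one plausible choice of statistic.

    ```python
    # Hypothetical data: one EEG-derived engagement score and one UX rating
    # per adaptive menu (all values invented for illustration).
    from scipy.stats import spearmanr

    eeg_engagement = [0.42, 0.51, 0.38, 0.60, 0.47]
    ux_rating = [3.1, 4.0, 2.8, 4.5, 3.6]

    rho, p = spearmanr(eeg_engagement, ux_rating)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
    # A high rho would mirror the reported correlation between EEG signals
    # and participants' UX ratings.
    ```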

    Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies

    How can designers determine highly effective and intuitive gesture sets for interactive systems, tailored to end users' preferences? A substantial body of knowledge is available on this topic, among which gesture elicitation studies stand out distinctively. In these studies, end users are invited to propose gestures for specific referents, the functions to control in an interactive system. The vast majority of gesture elicitation studies conclude with a consensus gesture set identified through a process of consensus or agreement analysis. However, the information about specific gesture sets determined for specific applications is scattered across a wide landscape of disconnected scientific publications, which makes it challenging for researchers and practitioners to effectively harness this body of knowledge. To address this challenge, we conducted a systematic literature review and examined a corpus of N=267 studies encompassing a total of 187,265 gestures elicited from 6,659 participants for 4,106 referents. To understand similarities in users' gesture preferences within this extensive dataset, we analyzed a sample of 2,304 gestures extracted from the studies identified in our literature review. Our approach consisted of (i) identifying the context of use represented by end users, devices, platforms, and gesture sensing technology, (ii) categorizing the referents, (iii) classifying the gestures elicited for those referents, and (iv) cataloging the gestures based on their representation and implementation modalities. Drawing from the findings of this review, we propose guidelines for conducting future end-user gesture elicitation studies.
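
    The agreement analysis mentioned above is most commonly performed with the agreement rate AR(r) of Vatavu and Wobbrock (2015), which scores how strongly participants' proposals for a referent r cluster on the same gesture. A minimal sketch in Python, assuming proposals can be compared by label:

    ```python
    from collections import Counter

    def agreement_rate(proposals):
        """proposals: one gesture label per participant for a single referent.
        AR(r) = (|P|/(|P|-1)) * sum((|Pi|/|P|)**2) - 1/(|P|-1), in [0, 1]."""
        P = len(proposals)
        if P < 2:
            return 1.0
        s = sum((c / P) ** 2 for c in Counter(proposals).values())
        return (P / (P - 1)) * s - 1 / (P - 1)

    # Example: 8 participants propose gestures for the referent "go back".
    print(agreement_rate(["swipe-left"] * 5 + ["tap", "tap", "shake"]))  # ~0.39
    ```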

    A Unified Model for Context-aware Adaptation of User Interfaces

    The variety of contexts of use in which interaction takes place nowadays is a challenge both for stakeholders during development and for end users during interaction. Stakeholders either ignore the exponential amount of information coming from the context of use, adopting the inaccessible one-size-fits-all approach, or must dedicate significant effort to carefully considering context differences while designing several different versions of the same user interface. For end users, web pages that are not adapted often become inaccessible in non-conventional contexts of use, such as on mobile devices like smartphones and tablet PCs. To alleviate such efforts, we propose in this paper a meta-model that, by means of a unified view, supports all phases of the implementation of context-aware adaptation for user interfaces. With such a formal abstraction of an interactive system, stakeholders can generate different instantiations with more concrete UIs that can properly handle and adapt to the constraints and characteristics of different contexts of use. We present CAMM, a meta-model for context-aware adaptation covering different application domains as well as a complete adaptation lifecycle. Moreover, we present various instantiations of this model for different scenarios of a car rental example.
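
    To make the idea concrete, the toy sketch below pairs a small context-of-use model (user, platform, environment) with two adaptation rules that produce a concrete UI variant. All class, field, and rule names are invented for illustration; they are not CAMM's actual metaclasses.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ContextOfUse:
        user_visual_acuity: str  # "low" | "normal"
        platform: str            # "smartphone" | "desktop"
        luminosity: str          # "dark" | "bright"

    def adapt_font_size(ctx: ContextOfUse) -> int:
        # Small screens and low visual acuity both push the font size up.
        size = 14
        if ctx.platform == "smartphone":
            size += 2
        if ctx.user_visual_acuity == "low":
            size += 4
        return size

    def adapt_theme(ctx: ContextOfUse) -> str:
        # The physical environment drives the colour theme.
        return "dark" if ctx.luminosity == "dark" else "light"

    ctx = ContextOfUse("low", "smartphone", "dark")
    print(adapt_font_size(ctx), adapt_theme(ctx))  # -> 20 dark
    ```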

    Controlled Thermonuclear Fusion - The State of Research

    Adapting user interfaces to different contexts of use is essential to enhance usability. Adaptation enhances user satisfaction by meeting changing context-of-use requirements. However, given the variety of contexts of use and the significant amount of information and contextual treatment involved, transformations of user interface models that consider adaptation become complex. This complexity becomes a challenge when trying to add new adaptation rules or modify the transformations. In this paper, we present "Adapt-first", an adaptation approach intended to simplify adaptation within model-based user interfaces. It capitalizes on differentiating adaptation from concretization via two transformation techniques: concretization and translation. The Adapt-first approach aims to reduce the complexity and maintenance effort of transformations from one model to another.
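
    The distinction between the two transformation techniques can be illustrated with a minimal sketch: adaptation operates once at the abstract level, and concretization then maps the adapted model to platform widgets. The dictionaries and function names are placeholders, not the paper's actual transformation machinery.

    ```python
    def adapt(model: dict, context: dict) -> dict:
        """Apply a context-driven adaptation rule at the abstract level."""
        if context.get("hands_busy"):
            return {**model, "modality": "voice"}
        return model

    def concretize(model: dict, platform: str) -> dict:
        """Translate the (already adapted) abstract model to concrete widgets."""
        widget = "picker" if platform == "mobile" else "dropdown"
        return {**model, "widget": widget}

    # Adapt first, then concretize once per platform.
    abstract = {"task": "select-country", "modality": "graphical"}
    print(concretize(adapt(abstract, {"hands_busy": True}), "mobile"))
    ```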

    A Newcomer's Guide to EICS, the Engineering Interactive Computing Systems Community

    Welcome to EICS, the Engineering Interactive Computing Systems community, the PACMHCI/EICS journal, and the annual conference! In this short article, we introduce newcomers to the field and to our community with an overview of what EICS is and how it is positioned with respect to other venues in Human-Computer Interaction, such as CHI, UIST, and IUI, highlighting its legacy and paying homage to past scientific events from which EICS emerged. We also take this opportunity to enumerate and exemplify scientific contributions to the field of Engineering Interactive Computing Systems, which we hope will guide researchers and practitioners towards making their future PACMHCI/EICS submissions successful and impactful in the EICS community. We acknowledge the support of MetaDev2 as the main sponsor of EICS 2019. We would like to thank the Chairs of all the tracks of the EICS 2019 conference, the members of the local organization team, and the webmaster of the EICS 2019 website. EICS 2019 could not have been possible without the commitment of the Programme Committee members and external reviewers. This work was partially supported by the Spanish Ministry of Economy, Industry and Competitiveness, State Research Agency / European Regional Development Fund under Vi-SMARt (TIN2016-79100-R), the Junta de Comunidades de Castilla-La Mancha / European Regional Development Fund under NeUX (SBPLY/17/180501/000192), the Generalitat Valenciana through project GISPRO (PROMETEO/2018/176), and the Spanish Ministry of Science and Innovation through project DataME (TIN2016-80811-P).

    HCI-E²: HCI Engineering Education

    This workshop aims at identifying, examining, structuring, and sharing educational resources and approaches to support the process of teaching/learning Human-Computer Interaction (HCI) Engineering. The broadening range of available interaction technologies and their applications, often in safety- and mission-critical areas and in novel, less understood domains, raises the question of how to address this ever-changing nature in university curricula, which are usually static. Moreover, as these technologies are taught in diverse curricula (ranging from human factors and psychology to hardcore computer science), we are interested in the best approaches and practices for integrating HCI Engineering topics into the curricula of programs in software engineering, computer science, human-computer interaction, psychology, design, etc. The workshop is proposed on behalf of the IFIP Working Groups 2.7/13.4 on User Interface Engineering and 13.1 on Education in HCI and HCI Curricula.
